Inference with Discriminative Posterior

Authors

  • Jarkko Salojärvi
  • Kai Puolamäki
  • Eerika Savia
  • Samuel Kaski
Abstract

We study Bayesian discriminative inference given a model family p(c, x, θ) that is assumed to contain all our prior information but is still known to be incorrect. This falls in between "standard" Bayesian generative modeling and Bayesian regression, where the margin p(x, θ) is known to be uninformative about p(c|x, θ). We give an axiomatic proof that the discriminative posterior is consistent for conditional inference; using the discriminative posterior is standard practice in classical Bayesian regression, but we show that it is theoretically justified for model families of joint densities as well. A practical benefit compared to Bayesian regression is that the standard methods of handling missing values in generative modeling can be extended into discriminative inference, which is useful if the amount of data is small. Compared to standard generative modeling, the discriminative posterior gives better conditional inference if the model family is incorrect. If the model family also contains the true model, the discriminative posterior gives the same result as standard Bayesian generative modeling. Practical computation is done with Markov chain Monte Carlo.
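The discriminative posterior described above weights parameters by the prior times the product of conditional likelihoods p(c_i | x_i, θ), rather than the joint p(c_i, x_i | θ), and the abstract notes that practical computation is done with MCMC. A minimal random-walk Metropolis sketch of this idea, under an assumed toy model family (two unit-variance Gaussian classes with means −θ and +θ and equal class priors; the model and all names are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: class 0 drawn from N(-1, 1), class 1 from N(+1, 1).
x = np.concatenate([rng.normal(-1, 1, 50), rng.normal(1, 1, 50)])
c = np.array([0] * 50 + [1] * 50)

def log_cond(theta, x, c):
    """Sum of log p(c_i | x_i, theta) for class means -theta, +theta.

    With unit variances and equal class priors the log-odds reduce to
    log p(x|1) - log p(x|0) = 2 * theta * x.
    """
    logit = 2.0 * theta * x
    log_p1 = -np.logaddexp(0.0, -logit)   # log sigmoid(logit), stable
    log_p0 = -np.logaddexp(0.0, logit)    # log sigmoid(-logit)
    return np.sum(np.where(c == 1, log_p1, log_p0))

def log_prior(theta):
    return -0.5 * theta ** 2              # standard normal prior

# Random-walk Metropolis over the *discriminative* posterior
#   p(theta | D) ∝ p(theta) * prod_i p(c_i | x_i, theta),
# i.e. the margin p(x | theta) is deliberately left out.
theta = 0.0
log_post = log_prior(theta) + log_cond(theta, x, c)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3)
    lp = log_prior(prop) + log_cond(prop, x, c)
    if np.log(rng.uniform()) < lp - log_post:
        theta, log_post = prop, lp
    samples.append(theta)

posterior_mean = np.mean(samples[1000:])  # discard burn-in
```

Since the toy data here are generated from a model inside the family, the discriminative posterior should concentrate near the generating value θ ≈ 1; the paper's point is that when the family is misspecified, this conditional fit differs from (and outperforms) the standard generative posterior for conditional inference.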


Similar articles

arXiv:0807.3470v2 [stat.ML] 18 Nov 2008 — Inference with Discriminative Posterior


Learning Inference Models for Computer Vision

Computer vision can be understood as the ability to perform inference on image data. Breakthroughs in computer vision technology are often marked by advances in inference techniques, as even the model design is often dictated by the complexity of inference in them. This thesis proposes learning based inference schemes and demonstrates applications in computer vision. We propose techniques for i...


Data-driven Sequential Monte Carlo in Probabilistic Programming

Most Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) algorithms in existing probabilistic programming systems suboptimally use only model priors as proposal distributions. In this work, we describe an approach for training a discriminative model, namely a neural network, in order to approximate the optimal proposal by using posterior estimates from previous runs of inference...


The informed sampler: A discriminative approach to Bayesian inference in generative computer vision models

Computer vision is hard because of a large variability in lighting, shape, and texture; in addition the image signal is non-additive due to occlusion. Generative models promised to account for this variability by accurately modelling the image formation process as a function of latent variables with prior beliefs. Bayesian posterior inference could then, in principle, explain the observation. W...


Non-parametric Structured Output Networks

Deep neural networks (DNNs) and probabilistic graphical models (PGMs) are the two main tools for statistical modeling. While DNNs provide the ability to model rich and complex relationships between input and output variables, PGMs provide the ability to encode dependencies among the output variables themselves. End-to-end training methods for models with structured graphical dependencies on top...



Journal:

Volume   Issue

Pages  -

Publication date: 2008